Augmented Lagrangian and Alternating Direction Methods for Convex Optimization: A Tutorial and Some Illustrative Computational Results
Author
Abstract
The alternating direction method of multipliers (ADMM) is a form of augmented Lagrangian algorithm that has experienced a renaissance in recent years due to its applicability to optimization problems arising from “big data” and image processing applications, and the relative ease with which it may be implemented in parallel and distributed computational environments. This chapter aims to provide an accessible introduction to the analytical underpinnings of the method, which are often obscured in treatments that do not assume knowledge of convex and set-valued analysis. In particular, it is tempting to view the method as an approximate version of the classical augmented Lagrangian algorithm, using one pass of block coordinate minimization to approximately minimize the augmented Lagrangian at each iteration. This chapter, assuming as little prior knowledge of convex analysis as possible, shows that the actual convergence mechanism of the algorithm is quite different, and then underscores these observations with some new computational results in which we compare the ADMM to algorithms that do indeed work by approximately minimizing the augmented Lagrangian. Acknowledgements: This material is based in part upon work supported by the National Science Foundation under Grant CCF-1115638. The author also thanks his student Wang Yao for his work on the computational experiments.
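To make the "one pass of block coordinate minimization" idea concrete, the following is a minimal sketch (not taken from the chapter) of the classical two-block ADMM applied to the lasso problem, minimizing (1/2)‖Ax − b‖² + λ‖z‖₁ subject to x = z, in the scaled dual form. The function names and parameter choices here are illustrative assumptions, not the author's implementation: each iteration minimizes the augmented Lagrangian over x with z fixed, then over z with the freshly updated x, then takes one dual ascent step.

```python
import numpy as np

def soft_threshold(v, k):
    # Proximal operator of k * ||.||_1: shrink each entry toward zero by k.
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    """Two-block ADMM for min 0.5*||Ax - b||^2 + lam*||z||_1  s.t. x = z."""
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable for the constraint x = z
    AtA = A.T @ A
    Atb = A.T @ b
    # Factor (A'A + rho*I) once; it is reused in every x-update.
    L = np.linalg.cholesky(AtA + rho * np.eye(n))
    for _ in range(iters):
        # x-step: minimize the augmented Lagrangian over x (z, u fixed).
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-step: minimize over z, using the just-updated x (one block pass).
        z = soft_threshold(x + u, lam / rho)
        # Dual update: one ascent step on the multiplier for x = z.
        u = u + x - z
    return x, z
```

Note that neither block step comes close to minimizing the full augmented Lagrangian jointly in (x, z); as the chapter argues, the convergence mechanism of this scheme is not that of an inexact augmented Lagrangian method.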
Similar Articles
Understanding the Convergence of the Alternating Direction Method of Multipliers: Theoretical and Computational Perspectives
The alternating direction method of multipliers (ADMM) is a form of augmented Lagrangian algorithm that has experienced a renaissance in recent years due to its applicability to optimization problems arising from “big data” and image processing applications, and the relative ease with which it may be implemented in parallel and distributed computational environments. While it is easiest to describe th...
Convex Optimization with ALADIN
This paper presents novel convergence results for the Augmented Lagrangian based Alternating Direction Inexact Newton method (ALADIN) in the context of distributed convex optimization. It is shown that ALADIN converges for a large class of convex optimization problems from any starting point to minimizers without needing line-search or other globalization routines. Under additional regularity a...
Augmented Lagrangian method for solving absolute value equation and its application in two-point boundary value problems
One of the most important topics considered by researchers in recent years is the absolute value equation (AVE). The absolute value equation seems to be a useful tool in optimization since it subsumes the linear complementarity problem, and thus also linear programming and convex quadratic programming. This paper introduces a new method for solving the absolute value equation. To do this, we transform a...
Linearized augmented Lagrangian and alternating direction methods for nuclear norm minimization
The nuclear norm is widely used to induce low-rank solutions for many optimization problems with matrix variables. Recently, it has been shown that the augmented Lagrangian method (ALM) and the alternating direction method (ADM) are very efficient for many convex programming problems arising from various applications, provided that the resulting subproblems are sufficiently simple to have close...
Alternating Direction Method of Multipliers for Real and Complex Polynomial Optimization Models
In this paper, we propose a new method for polynomial optimization with real or complex decision variables. The main ingredient of the approach is to apply the classical alternating direction method of multipliers (ADMM) based on the augmented Lagrangian function. In this particular case, this allows us to fully exploit the multi-block structure of the polynomial functions, even though the opti...